
    Mass and density estimates contribute to perceived heaviness with weights that depend on the densities’ reliability

    0.96). Density information contributed 14%, 21%, and 29% to heaviness when vision was strongly impaired, mildly impaired, or not impaired, respectively. Overall, the results strongly corroborate our model, which appears to be a promising unifying framework for a number of findings on the size-weight illusion.
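
    The weighting scheme described above follows the standard reliability-weighted cue-combination rule. The minimal sketch below (an illustration with hypothetical noise values, not the authors' implementation or data) shows how the density cue's contribution shrinks as its reliability drops; the assumed noise levels were chosen only so that the resulting weights fall near the 14–29% range reported above.

```python
import numpy as np

def combine_cues(mass_estimate, density_estimate, sigma_mass, sigma_density):
    """Reliability-weighted combination of two heaviness cues.

    Weights are proportional to the inverse variance (reliability) of each cue,
    so a noisier density estimate (e.g. under impaired vision) contributes less.
    """
    w_mass = 1.0 / sigma_mass**2
    w_density = 1.0 / sigma_density**2
    w_sum = w_mass + w_density
    heaviness = (w_mass * mass_estimate + w_density * density_estimate) / w_sum
    density_contribution = w_density / w_sum  # fraction contributed by density
    return heaviness, density_contribution

# Hypothetical noise levels: as visual noise on density decreases, its weight rises.
for sigma_d in (2.5, 2.0, 1.6):  # strongly impaired, mildly impaired, unimpaired vision
    h, w = combine_cues(mass_estimate=1.0, density_estimate=1.2,
                        sigma_mass=1.0, sigma_density=sigma_d)
    print(f"sigma_density={sigma_d}: heaviness={h:.3f}, density weight={w:.2%}")
```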

    Roughness and spatial density judgments on visual and haptic textures using virtual reality

    The purpose of this study is to investigate multimodal visual-haptic texture perception, for which we used virtual reality techniques. Participants judged a broad range of textures according to their roughness and their spatial density under visual, haptic, and visual-haptic exploration conditions. Participants were well able to differentiate between the textures using both the roughness and the spatial density judgments. When provided with visual-haptic textures, participants' performance increased for both judgments, indicating sensory combination of visual and haptic texture information. Most interestingly, performance for density and roughness judgments did not differ significantly, indicating that these estimates are highly correlated. This may be due to the fact that our textures were generated in virtual reality using a haptic point-force display (PHANToM). In conclusion, it seems that the roughness and spatial density estimates were based on the same physical parameters, given the display technology used.
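
    The performance increase under visual-haptic exploration is consistent with sensory combination of the two signals. A common benchmark for such combination (a standard textbook relation, not a result reported in the abstract) is the maximum-likelihood prediction that the bimodal estimate has lower variance than either unimodal estimate:

```latex
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad
w_V = \frac{\sigma_H^{2}}{\sigma_V^{2} + \sigma_H^{2}}, \quad
w_H = \frac{\sigma_V^{2}}{\sigma_V^{2} + \sigma_H^{2}}, \qquad
\sigma_{VH}^{2} = \frac{\sigma_V^{2}\,\sigma_H^{2}}{\sigma_V^{2} + \sigma_H^{2}}
\le \min\!\left(\sigma_V^{2}, \sigma_H^{2}\right).
```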

    Target Search and Inspection Strategies in Haptic Search

    Haptic search is a common everyday task, usually consisting of two processes: target search and target analysis. During target search we need to know where our fingers are in space and remember both the already completed path and the outline of the remaining space. During target analysis we need to determine whether a detected potential target is the desired one. Here we characterized the dynamics of exploratory movements in these two processes. In our experiments participants searched for a particular configuration of symbols on a rectangular tactile display. We observed that during target search participants preferentially moved the hand parallel to the edges of the tactile display, which possibly eased orientation within the search space. After a potential target was detected by any of the fingers, there was a higher probability that subsequent exploration was performed by the index or the middle finger. At the same time, these fingers slowed down dramatically. While in contact with the potential target, the index and the middle finger moved within a smaller area than the other fingers, which instead tended to move away to leave them space. These results suggest that the index and the middle finger are specialized for fine analysis in haptic search.

    Deep neural network model of haptic saliency

    Haptic exploration usually involves stereotypical systematic movements that are adapted to the task. Here we tested whether exploration movements are also driven by physical stimulus features. We designed haptic stimuli whose surface relief varied locally in spatial frequency, height, orientation, and anisotropy. In Experiment 1, participants explored two stimuli one after the other in order to decide whether they were the same or different. We trained a variational autoencoder to predict the spatial distribution of touch duration from the surface relief of the haptic stimuli. The model successfully predicted where participants touched the stimuli. It could also predict participants' touch distribution from the surface relief when tested with two new groups of participants, who performed a different task (Exp. 2) or explored different stimuli (Exp. 3). We further generated a large number of virtual surface reliefs, each uniformly expressing a certain combination of features, and correlated the model's responses with stimulus properties in order to infer which stimulus features were preferentially touched by participants. Our results indicate that haptic exploratory behavior is to some extent driven by the physical features of the stimuli, with edge-like structures, vertical and horizontal patterns, and rough regions, for example, being explored in more detail.
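
    A minimal sketch of the kind of model described, in PyTorch: a small convolutional variational autoencoder that maps a surface-relief height map to a predicted touch-duration map. The architecture, input size, and loss weighting are illustrative assumptions, not the published model.

```python
# Sketch: relief height map (1 x 64 x 64) -> predicted touch-duration map.
# All layer sizes and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReliefToTouchVAE(nn.Module):
    def __init__(self, latent_dim=32):
        super().__init__()
        # Encoder: relief map -> parameters of a latent Gaussian
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
        )
        self.fc_mu = nn.Linear(32 * 16 * 16, latent_dim)
        self.fc_logvar = nn.Linear(32 * 16 * 16, latent_dim)
        # Decoder: latent sample -> touch-duration map
        self.fc_dec = nn.Linear(latent_dim, 32 * 16 * 16)
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),   # 32 -> 64
        )

    def forward(self, relief):
        h = self.encoder(relief)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        h = self.fc_dec(z).view(-1, 32, 16, 16)
        touch_logits = self.decoder(h)
        return touch_logits, mu, logvar

def loss_fn(touch_logits, touch_target, mu, logvar, beta=1e-3):
    # Treat the normalized touch-duration map as a spatial distribution and
    # compare it to the model's softmax output via cross-entropy, plus a KL term.
    pred = F.log_softmax(touch_logits.flatten(1), dim=1)
    target = touch_target.flatten(1)
    target = target / target.sum(dim=1, keepdim=True)
    recon = -(target * pred).sum(dim=1).mean()
    kld = -0.5 * torch.mean(torch.sum(1 + logvar - mu**2 - logvar.exp(), dim=1))
    return recon + beta * kld

# Toy usage with random tensors standing in for relief / touch-duration maps.
model = ReliefToTouchVAE()
relief = torch.rand(8, 1, 64, 64)
touch = torch.rand(8, 1, 64, 64)
logits, mu, logvar = model(relief)
print(loss_fn(logits, touch, mu, logvar))
```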

    Auditory modulation of tactile taps perception

    We tested whether the tactile perception of sequences of taps delivered to the index fingertip can be modulated by sequences of auditory beeps. In the first experiment, the tactile and auditory sequences were always presented simultaneously and were structurally either similar or dissimilar. In the second experiment, the auditory and tactile sequences were always structurally similar but not always presented simultaneously. When structurally similar and presented simultaneously, the auditory sequences significantly modulated the perception of the tactile taps. This automatic combination of “redundant-like” tactile and auditory signals likely constitutes an optimization process that takes advantage of multimodal redundancy for perceptual estimates.

    A hierarchical sensorimotor control framework for human-in-the-loop robotic hands.

    Human manual dexterity relies critically on touch. Robotic and prosthetic hands are much less dexterous and make little use of the many tactile sensors available. We propose a framework modeled on the hierarchical sensorimotor controllers of the nervous system to link sensing to action in human-in-the-loop, haptically enabled, artificial hands.
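
    A toy sketch of the hierarchical idea: a slow task-level policy sets a grip-force target, a mid-level controller tracks it against the measured contact force, and a fast low-level reflex reacts to slip signals from the tactile sensors. All class names, gains, and thresholds below are hypothetical illustrations, not part of the proposed framework.

```python
# Toy three-level sensorimotor hierarchy for a haptically enabled hand.
# All names, gains, and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class TactileReading:
    normal_force: float   # N, measured at the fingertip
    slip_detected: bool   # fast "reflex" signal from the tactile sensor

class LowLevelReflex:
    """Fast loop: react to slip immediately, like a spinal reflex."""
    def adjust(self, command_force: float, reading: TactileReading) -> float:
        if reading.slip_detected:
            return command_force * 1.5  # quick grip-force boost on slip
        return command_force

class MidLevelController:
    """Track the grip-force target set by the task level (simple P-control)."""
    def __init__(self, gain: float = 0.5):
        self.gain = gain
    def command(self, target_force: float, reading: TactileReading) -> float:
        error = target_force - reading.normal_force
        return reading.normal_force + self.gain * error

class HighLevelTask:
    """Slow loop: choose a grip-force target for the current object and phase."""
    def target_force(self, object_weight: float, safety_margin: float = 1.3) -> float:
        return object_weight * 9.81 * safety_margin  # enough force to lift

def control_step(task, mid, reflex, object_weight, reading):
    target = task.target_force(object_weight)   # slow, task level
    command = mid.command(target, reading)       # grip-force tracking
    return reflex.adjust(command, reading)       # fast tactile reflex

reading = TactileReading(normal_force=2.0, slip_detected=True)
force = control_step(HighLevelTask(), MidLevelController(), LowLevelReflex(),
                     object_weight=0.3, reading=reading)
print(f"commanded grip force: {force:.2f} N")
```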

    Integration and disruption effects of shape and texture in haptic search

    In a search task, where one has to search for the presence of a target among distractors, the target is sometimes easily found, whereas in other searches it is much harder to find. Performance in a search task is influenced by the identity of the target, the identity of the distractors, and the differences between the two. In this study, these factors were manipulated by varying the target and distractors in shape (cube or sphere) and roughness (rough or smooth) in a haptic search task. Participants had to grasp a bundle of items and determine as fast as possible whether a predefined target was present or not. We found that roughness and edges were relatively salient features, and the search for the presence of these features was faster than the search for their absence. If the task was easy, the addition of these features could also disrupt performance, even if they were irrelevant to the search task. Another important finding was that the search for a target that differed in two properties from the distractors was faster than a search with only a single property difference, although this was only found when the two target properties were non-salient. This means that shape and texture can be effectively integrated. Finally, we found that edges are more beneficial than disruptive to a search task, whereas for roughness it is the other way around.

    Contact Force and Scanning Velocity during Active Roughness Perception

    Haptic perception is bidirectionally related to exploratory movements: exploration influences perception, but perception also influences exploration. We can optimize or change exploratory movements according to the percept and/or the task, consciously or unconsciously. This paper presents a psychophysical experiment on active roughness perception that investigates how movements change as the haptic task changes. Exerted normal force and scanning velocity were measured in different perceptual tasks (discrimination or identification) using rough and smooth stimuli. The results show that humans use a greater variation in contact force for the smooth stimuli than for the rough stimuli. Moreover, they use higher scanning velocities and shorter break times between stimuli in the discrimination task than in the identification task. Thus, in roughness perception humans spontaneously use different strategies that seem effective for the perceptual task and the stimuli. A control task, in which participants simply explored the stimuli without any perceptual objective, showed that humans use a smaller contact force and a lower scanning velocity for the rough stimuli than for the smooth stimuli. Possibly, these strategies are related to the aversiveness of exploring the stimuli.

    Using curvature information in haptic shape perception of 3D objects

    Are humans able to perceive the circularity of a cylinder that is grasped by the hand? This study presents the findings of an experiment in which cylinders with a circular cross-section had to be distinguished from cylinders with an elliptical cross-section. For comparison, the ability to distinguish a square cuboid from a rectangular cuboid was also investigated. Both elliptical and rectangular shapes can be characterized by their aspect ratio, but elliptical shapes also contain curvature information. We found that an elliptical shape with an aspect ratio of only 1.03 could be distinguished from a circular shape with both static and dynamic touch. For a rectangular shape, however, the aspect ratio needed to be about 1.11 for dynamic touch and 1.15 for static touch in order to be discernible from a square shape. We conclude that curvature information can be employed in a reliable and efficient manner in the haptic perception of 3D shapes.
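
    A brief worked illustration (an interpretation added here, not a calculation from the paper) of why curvature can be the more informative cue: for an ellipse with semi-axes a ≥ b, the curvature along the contour ranges from b/a² to a/b², so even a small aspect ratio produces a relatively large curvature variation.

```latex
\kappa(t) = \frac{ab}{\left(a^{2}\sin^{2}t + b^{2}\cos^{2}t\right)^{3/2}},
\qquad
\frac{\kappa_{\max}}{\kappa_{\min}} = \frac{a/b^{2}}{b/a^{2}} = \left(\frac{a}{b}\right)^{3}.
```

    For an aspect ratio of 1.03 the curvature thus varies by about 1.03³ − 1 ≈ 9% around the cross-section, whereas the side lengths of a cuboid with the same aspect ratio differ by only 3%, which is consistent with the much lower discrimination thresholds found for the elliptical cylinders.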